Lecture 2: Multiplicative Weights and Mirror Descent
Author
Abstract
In the last lecture, we considered the matrix scaling problem: Given non-negative matrices X, T ∈ ℝ^{n×n}_+, our goal was to find non-negative diagonal matrices D1, D2 so that D1 X D2 had the same row and column sums as the target matrix T. In other words, we sought to weight the rows and columns of X by positive numbers in order to achieve this. We used entropy optimization to prove the following theorem.
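The classical procedure for this problem is Sinkhorn's alternating scaling: repeatedly rescale the rows to match the target row sums, then the columns to match the target column sums. A minimal NumPy sketch (the function name and the fixed iteration count are illustrative choices, not from the lecture):

```python
import numpy as np

def sinkhorn_scale(X, row_sums, col_sums, iters=500):
    """Alternately rescale rows and columns of the positive matrix X
    so that D1 @ X @ D2 approaches the prescribed row/column sums."""
    d1 = np.ones(X.shape[0])
    d2 = np.ones(X.shape[1])
    for _ in range(iters):
        d1 = row_sums / (X @ d2)   # (X @ d2)_i is the current row sum i
        d2 = col_sums / (d1 @ X)   # (d1 @ X)_j is the current column sum j
    return np.diag(d1), np.diag(d2)

X = np.array([[1.0, 2.0],
              [3.0, 1.0]])
r = np.array([1.0, 1.0])  # target row sums (doubly stochastic scaling)
c = np.array([1.0, 1.0])  # target column sums
D1, D2 = sinkhorn_scale(X, r, c)
S = D1 @ X @ D2
print(S.sum(axis=0), S.sum(axis=1))  # both approach the targets
```

For strictly positive X and consistent targets (here the row and column sums both total 2), the iteration converges; in general, convergence depends on the support pattern of X, which is exactly where the entropy-optimization analysis from the lecture comes in.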
Similar resources
MS&E 336 Lecture 11: The multiplicative weights algorithm
This lecture is based on the corresponding paper of Freund and Schapire [2], though with some differences in notation and analysis. We introduce and study the multiplicative weights (MW) algorithm, which is an external regret minimizing (i.e., Hannan consistent) algorithm for playing a game. The same algorithm has been analyzed in various forms, particularly in the study of online learning; see...
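The update rule sketched in that excerpt (the Hedge form of multiplicative weights) is short enough to state in code. A hedged sketch, assuming the exponential-update variant with loss vectors in [0, 1]; the function name and the toy loss sequence are illustrative:

```python
import numpy as np

def multiplicative_weights(losses, eta=0.5):
    """Hedge/MW over n actions: play weights normalized to a
    distribution, then multiply each weight by exp(-eta * loss)."""
    n = losses.shape[1]
    w = np.ones(n)
    total = 0.0
    for loss in losses:          # one loss vector per round
        p = w / w.sum()          # distribution played this round
        total += p @ loss        # expected loss incurred
        w *= np.exp(-eta * loss) # multiplicative update
    return total, w

# Toy sequence: action 0 always costs 0, action 1 always costs 1.
losses = np.column_stack([np.zeros(100), np.ones(100)])
alg_loss, w = multiplicative_weights(losses)
print(alg_loss)  # stays bounded: weight shifts quickly to action 0
```

Here the best fixed action has total loss 0, and the algorithm's total loss stays O(1/eta), consistent with the external-regret guarantee the excerpt refers to.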
CS261: A Second Course in Algorithms Lecture #12: Applications of Multiplicative Weights to Games and Linear Programs∗
1 Extensions of the Multiplicative Weights Guarantee Last lecture we introduced the multiplicative weights algorithm for online decision-making. You don't need to remember the algorithm details for this lecture, but you should remember that it's a simple and natural algorithm (just one simple update per action per time step). You should also remember its regret guarantee, which we proved last l...
Generalization Error Bounds for Aggregation by Mirror Descent with Averaging
We consider the problem of constructing an aggregated estimator from a finite class of base functions which approximately minimizes a convex risk functional under the l1 constraint. For this purpose, we propose a stochastic procedure, the mirror descent, which performs gradient descent in the dual space. The generated estimates are additionally averaged in a recursive fashion with specific weig...
Lecture 10: Applications of Multiplicative Weight Updates: Lp Solving, Portfolio Management
Today we see how to use the multiplicative weight update method to solve other problems. In many settings there is a natural way to make local improvements that “make sense.” The multiplicative weight updates analysis from last time (via a simple potential function) allows us to understand and analyse the net effect of such sensible improvements. (Formally, what we are doing in many settings is ...
Lecture 3: Online Mirror Descent and Density Approximation
Let’s attempt to rephrase what we did last time in a more general setting. The idea is to view our algorithm as a sort of “regularized” local improvement algorithm. One should consult [Ch. 4, Bubeck, 2014] and [Ch. 5, Hazan, 2015] (and the references therein) for further information about online mirror descent and related algorithms from the convex optimization and machine learning literatures. Our ...
Publication date: 2016